# Code Understanding
## GECKO 7B (kifai)
License: Apache-2.0
Tags: Large Language Model, Transformers, Multilingual

GECKO is a 7-billion-parameter decoder-only Transformer model trained on Korean, English, and code.

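Given the Transformers tag, loading this checkpoint should follow the standard causal-LM pattern. A minimal sketch, assuming the repo id is `kifai/GECKO-7B` (inferred from the publisher and model name in this listing):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kifai/GECKO-7B"  # repo id assumed from the listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Complete a short code prompt with greedy decoding.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
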
## CuBERT 20210711 Python 512 (claudios)
License: Apache-2.0
Tags: Large Language Model, Transformers

CuBERT is a BERT-based contextual embedding model pre-trained on Python source code, designed for code understanding and analysis tasks.

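Since the listing describes CuBERT as an embedding model, a plausible use is extracting contextual code representations. A minimal sketch, assuming the repo id is `claudios/cubert-20210711-python-512` and that the repo bundles a usable tokenizer; note that the original CuBERT pipeline tokenizes code with its own program-aware tokenizer, so this is only an approximation:

```python
from transformers import AutoModel, AutoTokenizer

model_id = "claudios/cubert-20210711-python-512"  # repo id assumed from the listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Embed a Python snippet; mean-pool the final hidden states as a
# simple fixed-size code representation.
code = "def add(a, b):\n    return a + b"
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
embedding = model(**inputs).last_hidden_state.mean(dim=1)
print(embedding.shape)
```
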
## CodeT5 Base CodeXGLUE Sum Python (Salesforce)
License: BSD-3-Clause
Tags: Text Generation, Transformers

A code summarization model based on CodeT5-base, fine-tuned on the Python subset of the CodeXGLUE code summarization dataset.

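As a sequence-to-sequence summarization checkpoint, it can be driven through the usual T5 generation API. A minimal sketch, assuming the repo id is `Salesforce/codet5-base-codexglue-sum-python` (CodeT5 checkpoints are commonly paired with `RobertaTokenizer`):

```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration

model_id = "Salesforce/codet5-base-codexglue-sum-python"  # repo id assumed from the listing
tokenizer = RobertaTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Generate a short natural-language summary of a Python function.
code = "def add(a, b):\n    return a + b"
inputs = tokenizer(code, return_tensors="pt", truncation=True)
summary_ids = model.generate(inputs.input_ids, max_length=32, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```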